30 research outputs found

    MEANS AND AVERAGING ON RIEMANNIAN MANIFOLDS

    Processing of manifold-valued data has received considerable attention in recent years, since standard data processing methods are not adequate for such data. Among the many related tasks, finding means or averages of manifold-valued data is a basic and important one. Although means on Riemannian manifolds have a long history, there are still many unanswered theoretical questions about them, some of which we try to answer. We focus on two classes of means: the Riemannian L^p mean and the recursive-iterative means. The Riemannian L^p mean is defined as the solution(s) of a minimization problem, while the recursive-iterative means are defined through the notion of Mean-Invariance (MI) in a recursive and iterative process. We give a new existence and uniqueness result for the Riemannian L^p mean. Its significant consequence is that the local and global definitions of the Riemannian L^p mean coincide under an uncompromised condition which guarantees the uniqueness of the local mean. We also study smoothness, isometry compatibility, convexity, and noise sensitivity properties of the L^p mean. In particular, we argue that positive sectional curvature of a manifold can cause high sensitivity to noise for the L^2 mean, which might lead to a non-averaging behavior of that mean. We show that the L^2 mean on a manifold of positive curvature can still have an averaging property in a weak sense. We introduce the notion of MI and study a large class of recursive-iterative means. MI means are related to an interesting class of dynamical systems that can find Riemannian convex combinations. A special class of the MI means called the pairwise mean, which through an iterative scheme called Perimeter Shrinkage is related to cyclic pursuit on manifolds, is also studied. Finally, we derive results specific to the special orthogonal group and the Grassmannian manifold, as these manifolds appear naturally in many applications. We distinguish the 2-norm Finsler balls of appropriate radius in these manifolds as domains for existence and uniqueness of the studied means. We also introduce efficient numerical methods to perform the related calculations in the specified manifolds.
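To make the minimization definition concrete: the Riemannian L^2 (Karcher/Fréchet) mean minimizes the sum of squared geodesic distances and can be computed by intrinsic gradient descent. Below is a minimal sketch on the unit sphere; the step size, tolerance, and helper names are illustrative assumptions, not the thesis's algorithm, and the input points are assumed to be unit vectors within a small geodesic ball so the mean is unique.

```python
import numpy as np

def sphere_log(p, q):
    """Log map on the unit sphere: tangent vector at p pointing toward q."""
    v = q - np.dot(p, q) * p                  # component of q in the tangent space at p
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return np.zeros_like(p)
    theta = np.arccos(np.clip(np.dot(p, q), -1.0, 1.0))  # geodesic distance
    return (theta / nv) * v

def sphere_exp(p, v):
    """Exp map on the unit sphere: follow the geodesic from p along v."""
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return p
    return np.cos(nv) * p + np.sin(nv) * (v / nv)

def karcher_mean(points, iters=100, step=1.0):
    """Fixed-point / gradient iteration for the Riemannian L^2 mean."""
    mu = points[0]
    for _ in range(iters):
        # negative gradient of (1/2) * sum of squared geodesic distances
        grad = np.mean([sphere_log(mu, q) for q in points], axis=0)
        mu = sphere_exp(mu, step * grad)
        if np.linalg.norm(grad) < 1e-12:
            break
    return mu
```

The same exp/log template applies on any Riemannian manifold once those two maps are available, which is why the abstract's existence and uniqueness radii matter: outside them the iteration may not converge to a unique point.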

    Riemannian consensus for manifolds with bounded curvature

    Consensus algorithms are popular distributed algorithms for computing aggregate quantities, such as averages, in ad-hoc wireless networks. However, existing algorithms mostly address the case where the measurements lie in Euclidean space. In this work we propose Riemannian consensus, a natural extension of existing averaging consensus algorithms to the case of Riemannian manifolds. Unlike previous generalizations, our algorithm is intrinsic and, in principle, can be applied to any complete Riemannian manifold. We give sufficient convergence conditions on Riemannian manifolds with bounded curvature and analyze the differences with respect to the Euclidean case. We test the proposed algorithms on synthetic data sampled from the space of rotations, the sphere and the Grassmann manifold. This work was supported by the grant NSF CNS-0834470. Recommended by Associate Editor L. Schenato.
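The intrinsic update behind such algorithms replaces the Euclidean averaging step with exp/log maps: each node maps its neighbors into its own tangent space, averages there, and moves a small step along the resulting geodesic. A minimal sketch on the unit sphere follows; the step size, graph, and function names are illustrative assumptions rather than the paper's exact scheme.

```python
import numpy as np

def log_map(p, q):
    # log map on the unit sphere: tangent vector at p pointing toward q
    v = q - np.dot(p, q) * p
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return np.zeros_like(p)
    return (np.arccos(np.clip(np.dot(p, q), -1.0, 1.0)) / nv) * v

def exp_map(p, v):
    # exponential map on the unit sphere
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return p
    return np.cos(nv) * p + np.sin(nv) * (v / nv)

def riemannian_consensus(states, adj, eps=0.2, iters=300):
    """Each node takes a small step toward the tangent-space average of its
    neighbors; adj is a symmetric 0/1 adjacency matrix of the network."""
    x = np.array(states, dtype=float)
    for _ in range(iters):
        new = x.copy()
        for i in range(len(x)):
            nbrs = np.flatnonzero(adj[i])
            if len(nbrs) == 0:
                continue
            step = sum(log_map(x[i], x[j]) for j in nbrs) / len(nbrs)
            new[i] = exp_map(x[i], eps * step)
        x = new
    return x
```

For measurements clustered inside a suitable geodesic ball, the states contract toward a common configuration, which is exactly the regime the paper's bounded-curvature convergence conditions delimit.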

    Gradient Flow Based Matrix Joint Diagonalization for Independent Component Analysis

    In this thesis, employing the theory of matrix Lie groups, we develop gradient-based flows for the problem of Simultaneous or Joint Diagonalization (JD) of a set of symmetric matrices. This problem has applications in many fields, especially Independent Component Analysis (ICA). We consider both orthogonal and non-orthogonal JD. We view the JD problem as minimization of a common quadratic cost function on a matrix group, and derive gradient-based flows together with suitable discretizations for minimizing this cost function on the Riemannian manifolds of O(n) and GL(n). We use the developed JD methods to introduce a new class of ICA algorithms that sphere the data but do not restrict the subsequent search for the un-mixing matrix to orthogonal matrices. These methods provide robust ICA algorithms in Gaussian noise by making effective use of both second- and higher-order statistics.
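For the orthogonal case, a standard formulation (used here as an illustrative stand-in, not necessarily the thesis's exact discretization) minimizes the off-diagonal energy f(U) = Σ_k ||off(Uᵀ C_k U)||_F² over O(n), by projecting the Euclidean gradient onto the tangent space of the orthogonal group and retracting with a QR decomposition:

```python
import numpy as np

def off2(M):
    # squared Frobenius norm of the off-diagonal part of M
    return np.sum(M**2) - np.sum(np.diag(M)**2)

def jd_cost(U, Cs):
    return sum(off2(U.T @ C @ U) for C in Cs)

def jd_orthogonal(Cs, iters=2000, eta=0.02):
    """Riemannian gradient descent for joint diagonalization on O(n)."""
    n = Cs[0].shape[0]
    U = np.eye(n)
    for _ in range(iters):
        G = np.zeros((n, n))
        for C in Cs:
            M = U.T @ C @ U
            G += 4 * C @ U @ (M - np.diag(np.diag(M)))  # Euclidean gradient of off2
        rg = U @ (U.T @ G - G.T @ U) / 2                # project onto tangent space of O(n)
        Q, R = np.linalg.qr(U - eta * rg)
        U = Q * np.sign(np.diag(R))                     # QR retraction back to O(n)
    return U
```

The discretization choices here (fixed step, QR retraction) are the simplest ones; the thesis develops its own flows and discretizations.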

    Approximate joint diagonalization with Riemannian optimization on the general linear group

    We consider the classical problem of approximate joint diagonalization of matrices, which can be cast as an optimization problem on the general linear group. We propose a versatile Riemannian optimization framework for solving this problem, unifying existing methods and creating new ones. We use two standard Riemannian metrics (left- and right-invariant metrics) having opposite features regarding the structure of solutions and the model. We introduce the Riemannian optimization tools (gradient, retraction, vector transport) in this context, for the two standard non-degeneracy constraints (oblique and non-holonomic constraints). We also develop tools beyond the classical Riemannian optimization framework to handle the non-Riemannian quotient manifold induced by the non-holonomic constraint with the right-invariant metric. We illustrate our theoretical developments with numerical experiments on both simulated data and a real electroencephalographic recording.
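As a toy illustration of one corner of this design space — the oblique (unit-norm rows) constraint with a plain Euclidean metric and a row-renormalization retraction, deliberately simpler than the invariant metrics studied in the paper — one can minimize Σ_k ||off(B C_k Bᵀ)||_F² over matrices B with unit-norm rows:

```python
import numpy as np

def off2(M):
    # squared Frobenius norm of the off-diagonal part of M
    return np.sum(M**2) - np.sum(np.diag(M)**2)

def ajd_oblique(Cs, iters=3000, eta=0.005):
    """Projected gradient descent on the oblique manifold (unit-norm rows)."""
    n = Cs[0].shape[0]
    B = np.eye(n)
    for _ in range(iters):
        G = np.zeros((n, n))
        for C in Cs:
            M = B @ C @ B.T
            G += 4 * (M - np.diag(np.diag(M))) @ B @ C  # Euclidean gradient of off2
        G -= np.sum(G * B, axis=1, keepdims=True) * B   # row-wise tangent projection
        B = B - eta * G
        B /= np.linalg.norm(B, axis=1, keepdims=True)   # retraction: renormalize rows
    return B
```

The oblique constraint rules out the degenerate solution B = 0 while still allowing non-orthogonal diagonalizers, which is why it is one of the two standard non-degeneracy constraints the paper treats.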

    A Novel Non-Orthogonal Joint Diagonalization Cost Function for ICA

    We present a new scale-invariant cost function for non-orthogonal joint diagonalization of a set of symmetric matrices, with application to Independent Component Analysis (ICA). We derive two gradient minimization schemes to minimize this cost function, and consider their performance in the context of an ICA algorithm based on non-orthogonal joint diagonalization.

    Average consensus on Riemannian manifolds with bounded curvature

    Consensus algorithms are a popular choice for computing averages and other similar quantities in ad-hoc wireless networks. However, existing algorithms mostly address the case where the measurements live in a Euclidean space. In this paper, we propose distributed algorithms for averaging measurements lying in a Riemannian manifold. We first propose a direct extension of the classical average consensus algorithm and derive sufficient conditions for its convergence to a consensus configuration. Such conditions depend on the network connectivity, the geometric configuration of the measurements, and the curvature of the manifold. However, the consensus configuration to which the algorithm converges may not coincide with the Fréchet mean of the measurements. We thus propose a second algorithm that performs consensus in the tangent space. This algorithm is guaranteed to converge to the Fréchet mean of the measurements, but needs to be initialized at a consensus configuration. By combining these two methods, we obtain a distributed algorithm that converges to the Fréchet mean of the measurements. We test the proposed algorithms on synthetic data sampled from manifolds such as the space of rotations, the sphere, and the Grassmann manifold.
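The second stage described above, consensus in the tangent space, reduces to ordinary linear consensus on the log-mapped measurements followed by one exponential map. A minimal sketch on the unit sphere is given below; the base point, weight matrix, and helper names are illustrative assumptions, not the paper's notation.

```python
import numpy as np

def log_map(p, q):
    # log map on the unit sphere: tangent vector at p pointing toward q
    v = q - np.dot(p, q) * p
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return np.zeros_like(p)
    return (np.arccos(np.clip(np.dot(p, q), -1.0, 1.0)) / nv) * v

def exp_map(p, v):
    # exponential map on the unit sphere
    nv = np.linalg.norm(v)
    if nv < 1e-12:
        return p
    return np.cos(nv) * p + np.sin(nv) * (v / nv)

def tangent_space_consensus(base, points, W, iters=100):
    """Linear consensus on tangent vectors at a common base point.
    W is a doubly stochastic averaging matrix; row i of V holds node i's estimate."""
    V = np.array([log_map(base, p) for p in points])
    for _ in range(iters):
        V = W @ V          # standard Euclidean consensus iteration
    return np.array([exp_map(base, v) for v in V])
```

With a doubly stochastic W on a connected graph, every row of V converges to the average of the initial tangent vectors, so all nodes map back to the same point: a first-order estimate of the Fréchet mean anchored at the chosen base point. This is why the paper requires the base (consensus configuration) to be found first by the direct extension of average consensus.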